Conceptualizing Birkhoff's Aesthetic Measure Using Shannon Entropy and Kolmogorov Complexity
Authors
Abstract
In 1928, George D. Birkhoff introduced the Aesthetic Measure, defined as the ratio between order and complexity, and in 1965 Max Bense analyzed Birkhoff’s measure from an information-theoretic point of view. In this paper, the concepts of order and complexity in an image (in our case, a painting) are analyzed in the light of Shannon entropy and Kolmogorov complexity. We also present a new vision of the creative process: the initial uncertainty, obtained from the Shannon entropy of the repertoire (palette), is transformed into algorithmic information content, defined by the Kolmogorov complexity of the image. From this perspective, Birkhoff’s Aesthetic Measure is presented as the ratio between the algorithmic reduction of uncertainty (order) and the initial uncertainty (complexity). The proposed measures are applied to several works by Mondrian, Pollock, and van Gogh.
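The ratio described in the abstract can be illustrated with a short sketch. The Python fragment below is not the authors' implementation; it assumes, for illustration only, that the initial uncertainty is the pixel count times the Shannon entropy of the palette histogram, and it substitutes zlib-compressed size for the (uncomputable) Kolmogorov complexity. The names palette_entropy and aesthetic_measure are hypothetical.

import zlib
from collections import Counter
from math import log2

def palette_entropy(pixels):
    # Shannon entropy of the colour histogram, in bits per pixel.
    counts = Counter(pixels)
    n = len(pixels)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def aesthetic_measure(pixels):
    # Initial uncertainty of the repertoire: N pixels times the palette entropy.
    n = len(pixels)
    initial_uncertainty = n * palette_entropy(pixels)
    # Kolmogorov complexity is uncomputable; the zlib-compressed size (in bits)
    # of the raw pixel stream serves here as a crude, computable upper bound.
    raw = bytes(b for px in pixels for b in px)
    algorithmic_content = 8 * len(zlib.compress(raw, 9))
    # Order = the reduction of the initial uncertainty achieved by the algorithmic
    # description; the measure is order divided by the initial uncertainty,
    # echoing Birkhoff's order/complexity ratio.
    order = initial_uncertainty - algorithmic_content
    return order / initial_uncertainty if initial_uncertainty > 0 else 0.0

# Toy usage with a hypothetical 10,000-pixel, two-colour "image" made of long
# uniform runs: a highly ordered pattern should yield a measure close to 1.
img = [(255, 255, 255)] * 9000 + [(0, 0, 0)] * 1000
print(round(aesthetic_measure(img), 3))

Note that for very small inputs the compressor's fixed overhead dominates and the sketch can return a negative value; it is meant only to make the order/complexity ratio concrete, not to reproduce the paper's numerical results.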
Similar Resources
Entropy of infinite systems and transformations
The Kolmogorov-Sinai entropy is a far-reaching dynamical generalization of the Shannon entropy of information systems. This entropy works perfectly for probability-measure-preserving (p.m.p.) transformations. However, it is not useful when there is no finite invariant measure. There are certain successful extensions of the notion of entropy to infinite measure spaces, or transformations with ...
Inequalities for Shannon Entropy and Kolmogorov Complexity
It was mentioned by Kolmogorov (1968, IEEE Trans. Inform. Theory 14, 662–664) that the properties of algorithmic complexity and Shannon entropy are similar. We investigate one aspect of this similarity. Namely, we are interested in linear inequalities that are valid for Shannon entropy and for Kolmogorov complexity. It turns out that (1) all linear inequalities that are valid for Kolmogorov com...
Shannon Entropy vs. Kolmogorov Complexity
Most assertions involving Shannon entropy have their Kolmogorov complexity counterparts. A general theorem of Romashchenko [4] states that every information inequality that is valid in Shannon’s theory is also valid in Kolmogorov’s theory, and vice versa. In this paper we prove that this is no longer true for ∀∃-assertions, exhibiting the first example where the formal analogy between Shannon e...
Shannon Information and Kolmogorov Complexity
We compare the elementary theories of Shannon information and Kolmogorov complexity, the extent to which they have a common purpose, and where they are fundamentally different. We discuss and relate the basic notions of both theories: Shannon entropy versus Kolmogorov complexity, the relation of both to universal coding, Shannon mutual information versus Kolmogorov (‘algorithmic’) mutual inform...
Information Distances versus Entropy Metric
Information distance has become an important tool in a wide variety of applications. Various types of information distance have been proposed over the years. These information distance measures differ from the entropy metric, as the former are based on Kolmogorov complexity and the latter on Shannon entropy. However, for any computable probability distributions, up to a constant, the expected val...